# Instruction-following pretraining
Ahma 7B

- License: Apache-2.0
- Publisher: Finnish-NLP
- Tags: Large Language Model, Transformers

Ahma-7B is a 7-billion-parameter decoder-only Transformer model based on the Meta Llama (v1) architecture, pretrained from scratch entirely on Finnish-language data.
© 2025 AIbase